The kernel mutual information
Authors
Abstract
We introduce a new contrast function, the kernel mutual information (KMI), to measure the degree of independence of continuous random variables. This contrast function provides an approximate upper bound on the mutual information, as measured near independence, and is based on a kernel density estimate of the mutual information between a discretised approximation of the continuous random variables. We show that Bach and Jordan’s kernel generalised variance (KGV) is also an upper bound on the same kernel density estimate, but is looser. Finally, we suggest that the addition of a regularising term in the KGV causes it to approach the KMI, which motivates the introduction of this regularisation.
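To make the underlying idea concrete, here is a minimal NumPy sketch of a plug-in mutual-information estimate built from a 2-D Gaussian kernel density evaluated on a grid. This is only an illustration of kernel-density-based MI estimation in general, not the paper's KMI contrast function (which bounds the MI via RKHS covariance operators); the function name, bandwidth, and grid size are illustrative choices.

```python
import numpy as np

def kde_mutual_information(x, y, bandwidth=0.3, grid_size=50):
    """Illustrative plug-in estimate of I(X;Y): fit a 2-D Gaussian
    kernel density on a regular grid, then compute the discrete MI
    of the normalised grid probabilities. Not the paper's KMI."""
    # standardise so one bandwidth suits both variables
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    gx = np.linspace(x.min(), x.max(), grid_size)
    gy = np.linspace(y.min(), y.max(), grid_size)
    # Gaussian kernel evaluations: rows index grid points, columns samples
    kx = np.exp(-0.5 * ((gx[:, None] - x[None, :]) / bandwidth) ** 2)
    ky = np.exp(-0.5 * ((gy[:, None] - y[None, :]) / bandwidth) ** 2)
    # joint density on the grid: sum over samples of k(gx, x_i) k(gy, y_i)
    pxy = kx @ ky.T
    pxy /= pxy.sum()                      # normalise to a probability table
    px = pxy.sum(axis=1)                  # marginal over x grid points
    py = pxy.sum(axis=0)                  # marginal over y grid points
    # discrete plug-in MI (a KL divergence, hence non-negative)
    return float(np.sum(pxy * np.log(pxy / (px[:, None] * py[None, :]))))
```

On strongly dependent data (e.g. `y = x + noise`) this returns a markedly larger value than on independent draws, which is the qualitative behaviour any MI-based independence measure must exhibit near independence.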
Similar Papers
Computationally Efficient Estimation of Squared-Loss Mutual Information with Multiplicative Kernel Models
Squared-loss mutual information (SMI) is a robust measure of the statistical dependence between random variables. The sample-based SMI approximator called least-squares mutual information (LSMI) was demonstrated to be useful in performing various machine learning tasks such as dimension reduction, clustering, and causal inference. The original LSMI approximates the pointwise mutual information ...
Kernel Methods for Measuring Independence
We introduce two new functionals, the constrained covariance and the kernel mutual information, to measure the degree of independence of random variables. These quantities are both based on the covariance between functions of the random variables in reproducing kernel Hilbert spaces (RKHSs). We prove that when the RKHSs are universal, both functionals are zero if and only if the random variable...
Kernel-based metric for performance evaluation of video infrared target tracking
A kernel-based metric measuring tracking reliability that is based on discriminative components of a kernel target model and kernel mutual information is presented. The discriminative components of the kernel target model are selected by computing the log-likelihood ratios of class-conditional sample densities of these components from a target region and background sampled region. The components...
A Kernel-Based Calculation of Information on a Metric Space
Kernel density estimation is a technique for approximating probability distributions. Here, it is applied to the calculation of mutual information on a metric space. This is motivated by the problem in neuroscience of calculating the mutual information between stimuli and spiking responses; the space of these responses is a metric space. It is shown that kernel density estimation on a m...
Multi-sensor Image Fusion for Effective Night Vision through Contourlet Transform and KPCA and Mutual Information
This paper presents a new image fusion algorithm by combining contourlet transform with Kernel Principal Component Analysis to enhance perception in case of night vision applications. Contourlet Transform improves visual perception preserving the edge and texture information as compared to wavelet transform, while Kernel Principal Component Analysis helps to develop effective fusion decision ru...